A Modified Hybrid Partitioned VM-Method for Unconstrained Optimization

Authors
Abstract

Similar articles

A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
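
For reference, the two properties named in this abstract are standard; writing g_k = \nabla f(x_k), and with generic constants not taken from the cited paper, they read:

    % sufficient descent condition, for some constant c > 0
    g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k,
    % Wolfe line-search conditions on the step size \alpha_k, with 0 < \delta < \sigma < 1
    f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k,
    \qquad g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k.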

Full text

A New Scaled Hybrid Modified BFGS Algorithms for Unconstrained Optimization

The BFGS method is a standard method for solving unconstrained optimization problems, and many modifications of it have been proposed. In this paper, new scaled hybrid modified BFGS algorithms are proposed and analyzed. The scaled hybrid modified BFGS can reduce the number of iterations. Results obtained by the hybrid modified BFGS algorithms are com...
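
As background (the textbook formula, not the scaled hybrid update of the cited paper), the BFGS method updates a Hessian approximation B_k using s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k:

    B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}.

Scaled and hybrid variants typically rescale B_k or modify y_k before applying this update; the particular choices analyzed in the cited paper are not reproduced here.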

Full text

A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of the line search method, based on eigenvalue analysis. The globa...
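
For orientation only, a three-term conjugate gradient direction built on the Liu-Storey parameter has the generic form below; the specific third-term coefficient \theta_k obtained from the eigenvalue analysis of the cited paper is not reproduced here.

    d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k^{LS} d_k + \theta_k y_k,
    \qquad \beta_k^{LS} = \frac{g_{k+1}^{\top} y_k}{-d_k^{\top} g_k}, \qquad y_k = g_{k+1} - g_k.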

Full text

A Modified Regularized Newton Method for Unconstrained Nonconvex Optimization

In this paper, we present a modified regularized Newton method for unconstrained nonconvex optimization using a trust-region technique. We show that if the gradient and Hessian of the objective function are Lipschitz continuous, then the modified regularized Newton method (M-RNM) has a global convergence property. Numerical results show that the algorithm is very efficient.
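
As a sketch of the general idea (not the exact M-RNM of the cited paper), a regularized Newton method with a trust-region-style test computes the step from a shifted Newton system and adjusts the regularization parameter according to how well a local quadratic model m_k predicts the actual reduction:

    (\nabla^2 f(x_k) + \mu_k I)\, d_k = -\nabla f(x_k), \qquad \mu_k > 0,
    r_k = \frac{f(x_k) - f(x_k + d_k)}{m_k(0) - m_k(d_k)};

the step is accepted and \mu_k reduced when r_k is large enough, otherwise \mu_k is increased and the step recomputed.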

Full text

A Modified Conjugate Gradient Method for Unconstrained Optimization

Conjugate gradient methods are an important class of methods for solving unconstrained optimization problems, especially large-scale problems, and they have recently been studied in depth. In this paper, we further study the conjugate gradient method for unconstrained optimization. We focus our attention on descent conjugate gradient methods. This paper presents a modified conjugate gradien...
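
For reference, the generic conjugate gradient iteration discussed in this abstract is, with standard notation not taken from the cited paper,

    x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,

where \alpha_k is obtained from a line search and \beta_k is a conjugacy parameter, e.g. the Fletcher-Reeves choice \beta_k^{FR} = \|g_{k+1}\|^2 / \|g_k\|^2; modified descent methods alter \beta_k or d_{k+1} so that every d_{k+1} is a descent direction.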

Full text

Journal

Journal title: Kirkuk University Journal-Scientific Studies

Year: 2012

ISSN: 2616-6801

DOI: 10.32894/kujss.2012.45300